
mistralai/mistral-nemo-i... · 12B · mistral

A 12B-parameter model from Mistral AI, NeMo offers a long 128k-token context length, advanced world knowledge, and function calling for developers.

Tool use
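Since the model advertises tool use, a request might look like the sketch below, which builds a payload in the OpenAI function-calling schema (the kind accepted by LM Studio's local OpenAI-compatible server). The endpoint URL, the model identifier string, and the `get_weather` tool are illustrative assumptions, not details from this model card.

```python
import json

# Hypothetical tool definition in the OpenAI function-calling schema.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # illustrative tool name, not from the model card
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Chat-completions payload; the model identifier is an assumption.
payload = {
    "model": "mistralai/mistral-nemo-instruct-2407",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
}

# Sending it to a locally running OpenAI-compatible server would look like:
#   requests.post("http://localhost:1234/v1/chat/completions", json=payload)
print(json.dumps(payload, indent=2))
```

When the model decides to call the tool, the response contains a `tool_calls` entry with the function name and JSON-encoded arguments, which your code executes and feeds back as a `tool` role message.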

Last Updated   9 days ago

Min. 7GB
README

Mistral Nemo Instruct 2407 by mistralai

Mistral Nemo was trained with a context length of up to 128k tokens; longer inputs may still work, but with potentially reduced quality.
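A practical consequence is that prompts are best kept within the 128k-token training window. A minimal guard might look like the sketch below; the 4-characters-per-token heuristic and the helper names are assumptions for illustration, and a real tokenizer should be used for anything precise.

```python
MAX_CONTEXT_TOKENS = 128_000  # training context length from the model card

def estimate_tokens(text: str) -> int:
    # Crude heuristic (~4 characters per token); swap in the model's
    # actual tokenizer for accurate counts.
    return max(1, len(text) // 4)

def fits_context(prompt: str, reserve_for_output: int = 1024) -> bool:
    # Leave room for the model's reply inside the trained window.
    return estimate_tokens(prompt) + reserve_for_output <= MAX_CONTEXT_TOKENS

print(fits_context("Hello, Nemo!"))  # a short prompt fits easily
```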

This model shows strong performance across a range of benchmarks, including multilingual tasks.

For more details, check the blog post here: https://mistral.ai/news/mistral-nemo/

sources

The underlying model files this model uses

When you download this model, LM Studio picks the source that will best suit your machine (you can override this)

config

Custom configuration options included with this model

No custom configuration.